Remote Data Jobs · Delta Lake

Job listings

  • Design and implement data pipelines using Databricks, PySpark, and Delta Lake.
  • Work closely with business stakeholders and analysts to understand KPIs.
  • Model and structure data using dimensional modeling techniques.
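The dimensional modeling mentioned above usually means organizing data into a star schema: a fact table of measures keyed into descriptive dimension tables. A minimal sketch in plain Python, with all table names and data hypothetical:

```python
# Star-schema sketch: a fact table holds measures plus foreign keys
# into dimension tables. Names and values are illustrative only.

dim_date = {1: {"date": "2024-01-01", "quarter": "Q1"}}
dim_product = {10: {"name": "Widget", "category": "Hardware"}}

fact_sales = [
    {"date_key": 1, "product_key": 10, "units": 3},
    {"date_key": 1, "product_key": 10, "units": 1},
]

def units_by_category(facts, products):
    """Aggregate a fact-table measure by a dimension attribute."""
    totals = {}
    for row in facts:
        category = products[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0) + row["units"]
    return totals

print(units_by_category(fact_sales, dim_product))  # {'Hardware': 4}
```

In a warehouse this same aggregation would be a join from the fact table to the product dimension followed by a GROUP BY on the category attribute.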

Clear Tech specializes in Data, Analytics, and Artificial Intelligence, helping companies around the world transform their data into real business value. Their team combines highly skilled talent in Latin America with global best practices across cloud technologies to deliver end-to-end projects.

  • Architect and implement Databricks Lakehouse solutions for large-scale data platforms.
  • Design and optimize batch & streaming data pipelines using Apache Spark (PySpark/SQL).
  • Implement Delta Lake best practices (ACID, schema enforcement, time travel, performance tuning).

The company is looking for a Databricks Architect to design and lead modern Lakehouse data platforms built on Databricks. The role focuses on building scalable, high-performance data pipelines and enabling analytics and AI use cases on cloud-native platforms.

  • Design and evolve the enterprise Azure Lakehouse architecture.
  • Lead the transformation of classic Data Warehouse environments into a modern Lakehouse architecture.
  • Define and implement architecture principles, standards, patterns, and best practices for data engineering and analytics platforms.

Deutsche Telekom IT Solutions Slovakia, formerly T-Systems Slovakia, has been part of the Košice region since 2006. It has grown into the second-largest employer in eastern Slovakia, with over 3,900 employees, and aims to provide innovative information and communication technology services.